Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
Authors
Abstract
The aim of this paper is to derive convergence results for projected line-search methods on the real-algebraic variety M≤k of real m × n matrices of rank at most k. Such methods extend Riemannian optimization methods, which are successfully used on the smooth manifold Mk of rank-k matrices, to its closure by taking steps along gradient-related directions in the tangent cone, and afterwards projecting back to M≤k. Considering such a method circumvents the difficulties that arise from the nonclosedness and the unbounded curvature of Mk. Pointwise convergence is obtained for real-analytic functions on the basis of a Łojasiewicz inequality for the projection of the antigradient to the tangent cone. If the derived limit point lies on the smooth part of M≤k, i.e., in Mk, this reduces to results that are essentially known, but with the benefit that asymptotic convergence rate estimates (for specific step-sizes) can be obtained without an a priori curvature bound, simply from the fact that the limit lies on a smooth manifold. At the same time, one can give a convincing justification for assuming critical points to lie in Mk: if X is a critical point of f on M≤k, then either X has rank k, or ∇f(X) = 0.
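To make the iteration concrete, here is a minimal numerical sketch. It implements a deliberately simplified variant, projected gradient descent in which the full antigradient step is retracted to M≤k by a truncated SVD (the best rank-k approximation), rather than the paper's construction based on projecting the antigradient to the tangent cone; the quadratic objective f(X) = ½‖X − A‖²_F, the matrix A, the rank bound k, and the constant step-size α are placeholder choices for illustration.

```python
import numpy as np

def project_rank_k(Y, k):
    """Metric projection of Y onto the variety M<=k: keep the k largest
    singular values (best rank-k approximation, Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def projected_gradient_step(X, grad_f, k, alpha):
    """One step of the simplified scheme: move along the antigradient with
    step-size alpha in the ambient space, then project back to M<=k."""
    return project_rank_k(X - alpha * grad_f(X), k)

if __name__ == "__main__":
    # Placeholder problem: minimize f(X) = 0.5 * ||X - A||_F^2 over M<=k,
    # whose Euclidean gradient is X - A.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 15))
    k, alpha = 3, 1.0
    X = project_rank_k(rng.standard_normal((20, 15)), k)
    for _ in range(50):
        X = projected_gradient_step(X, lambda Z: Z - A, k, alpha)
    # For this quadratic objective with alpha = 1 the iteration reaches the
    # best rank-k approximation of A after one step; the loop is a smoke test.
    print(np.linalg.norm(X - project_rank_k(A, k)))
```

In the sketch the truncated SVD plays the role of the retraction onto the closed variety; in the paper's setting the search direction additionally lives in the tangent cone of M≤k at the current iterate, which is the object for which the Łojasiewicz-type inequality above is formulated.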
Similar resources
Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
As an initial step towards low-rank optimization algorithms using hierarchical tensors, the aim of this paper is to derive convergence results for projected line-search methods on the real-algebraic variety M≤k of real m × n matrices of rank at most k. Such methods extend the Riemannian optimization methods that are successfully used on the smooth manifold Mk of rank-k matrices to its closure by taking steps...
Quadratic Optimization with Orthogonality Constraints: Explicit Łojasiewicz Exponent and Linear Convergence of Line-Search Methods
A fundamental class of matrix optimization problems that arise in many areas of science and engineering is that of quadratic optimization with orthogonality constraints. Such problems can be solved using line-search methods on the Stiefel manifold, which are known to converge globally under mild conditions. To determine the convergence rates of these methods, we give an explicit estimate of the...
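For context, the Łojasiewicz exponent mentioned here is the exponent in the classical Łojasiewicz gradient inequality; in one common normalization (the symbols x*, θ, Λ, and the neighborhood U are generic and chosen here for illustration) it reads
\[
  \lvert f(x) - f(x^{\ast}) \rvert^{\,1-\theta} \;\le\; \Lambda\, \lVert \nabla f(x) \rVert
  \qquad \text{for all } x \in U,
\]
with θ ∈ (0, 1/2]; the closer θ can be taken to 1/2, the faster the asymptotic rate that line-search methods inherit from the inequality.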
A note on the convergence of nonconvex line search
In this note, we consider the line search for a class of abstract nonconvex algorithms that has been deeply studied in Kurdyka-Łojasiewicz theory. We provide a weak convergence result for the line search in general. When the objective function satisfies the Kurdyka-Łojasiewicz property and certain additional assumptions, a global convergence result can be derived. An application is presented for t...
Linear Convergence of Proximal-Gradient Methods under the Polyak-Łojasiewicz Condition
In 1963, Polyak proposed a simple condition that is sufficient to show that gradient descent has a global linear convergence rate. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong convexity (or even convexity). In this work, we show that this much older Polyak-Łojasiewicz (PL) inequality is actually weaker than the four mai...
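For reference, the Polyak-Łojasiewicz (PL) inequality referred to in this snippet is usually stated as follows (f* denotes the minimum value and μ > 0 is a constant; the symbols are chosen here for illustration):
\[
  \tfrac{1}{2}\,\lVert \nabla f(x) \rVert^{2} \;\ge\; \mu \bigl( f(x) - f^{\ast} \bigr)
  \qquad \text{for all } x,
\]
which holds for strongly convex functions but also for many nonconvex ones, and already suffices for a linear convergence rate of gradient descent.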
Convergence of Non-smooth Descent Methods Using the Kurdyka-Łojasiewicz Inequality
We investigate the convergence of subgradient-oriented descent methods in non-smooth non-convex optimization. We prove convergence in the sense of subsequences for functions with a strict standard model, and we show that convergence to a single critical point may be guaranteed if the Kurdyka–Łojasiewicz inequality is satisfied. We show, by way of an example, that the Kurdyka–Łojasiewicz inequal...
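The Kurdyka-Łojasiewicz (KL) inequality used in this nonsmooth setting is commonly phrased through a desingularizing function; one standard form (the symbols φ, η, ∂f, and the neighborhood of x* are generic and chosen here for illustration) is
\[
  \varphi'\bigl( f(x) - f(x^{\ast}) \bigr)\, \operatorname{dist}\bigl( 0, \partial f(x) \bigr) \;\ge\; 1
\]
for all x near x* with f(x*) < f(x) < f(x*) + η, where φ : [0, η) → [0, ∞) is concave and continuous with φ(0) = 0 and φ' > 0 on (0, η), and ∂f denotes the (limiting) subdifferential.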
Journal title: SIAM Journal on Optimization
Volume 25, Issue -
Pages -
Publication date: 2015